
    An optimization model for the US Air-Traffic System

    A systematic approach for monitoring U.S. air traffic was developed in the context of system-wide planning and control. Towards this end, a network optimization model with nonlinear objectives was chosen as the central element in the planning/control system. The network representation was selected because: (1) it provides a comprehensive structure for depicting essential aspects of the air traffic system, (2) it can be solved efficiently for large-scale problems, and (3) the design can be easily communicated to non-technical users through computer graphics. Briefly, the network planning models consider the flow of traffic through a graph as the basic structure. Nodes depict locations and time periods for either individual planes or for aggregated groups of airplanes. Arcs define variables as actual airplanes flying through space or as delays across time periods. As such, a special case of the network can be used to model the so-called flow control problem. Due to the large number of interacting variables and the difficulty in subdividing the problem into relatively independent subproblems, an integrated model was designed to depict the entire high-level (above 29,000 feet) jet route system for the 48 contiguous states in the U.S. As a first step in demonstrating the concept's feasibility, a nonlinear risk/cost model was developed for the Indianapolis Airspace. The nonlinear network program NLPNETG was employed in solving the resulting test cases. This optimization program uses the Truncated-Newton method (quadratic approximation) for determining the search direction at each iteration in the nonlinear algorithm. It was shown that aircraft could be re-routed in an optimal fashion whenever traffic congestion increased beyond an acceptable level, as measured by the nonlinear risk function.
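    The truncated-Newton idea the abstract describes can be sketched in a few lines. The actual NLPNETG objective and network structure are not given here, so a toy convex quadratic stands in; the point is the inner loop, where the Newton system H d = -g is solved only approximately by a few conjugate-gradient iterations.

```python
import numpy as np

def truncated_newton(grad, hess, x0, cg_iters=5, newton_iters=20, tol=1e-8):
    """Minimise a smooth convex function, approximating each Newton
    direction with a few conjugate-gradient (CG) iterations."""
    x = x0.astype(float)
    for _ in range(newton_iters):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        # Approximately solve H d = -g with truncated CG.
        H = hess(x)
        d = np.zeros_like(x)
        r = -g.copy()           # residual of the Newton system
        p = r.copy()
        for _ in range(cg_iters):
            Hp = H @ p
            alpha = (r @ r) / (p @ Hp)
            d += alpha * p
            r_new = r - alpha * Hp
            beta = (r_new @ r_new) / (r @ r)
            p = r_new + beta * p
            r = r_new
            if np.linalg.norm(r) < tol:
                break
        x = x + d               # unit step; a line search would go here
    return x

# Toy stand-in for a risk/cost objective: f(x) = 0.5 x^T A x - b^T x
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
xmin = truncated_newton(lambda x: A @ x - b, lambda x: A, np.zeros(2))
```

    Because each direction only needs a matrix-vector product, the same scheme scales to the large sparse networks the abstract targets.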

    Integrated risk/cost planning models for the US Air Traffic system

    A prototype network planning model for the U.S. Air Traffic control system is described. The model encompasses the dual objectives of managing collision risks and transportation costs, where traffic flows can be related to these objectives. The underlying structure is a network graph with nonseparable convex costs; the model is solved efficiently by capitalizing on its intrinsic characteristics. Two specialized algorithms for solving the resulting problems are described: (1) truncated Newton, and (2) simplicial decomposition. The feasibility of the approach is demonstrated using data collected from a control center in the Midwest. Computational results with different computer systems are presented, including a vector supercomputer (CRAY-XMP). The risk/cost model has two primary uses: (1) as a strategic planning tool using aggregate flight information, and (2) as an integrated operational system for forecasting congestion and monitoring (controlling) flow throughout the U.S. In the latter case, access to a supercomputer is required due to the model's enormous size.
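    The second algorithm mentioned, simplicial decomposition, keeps a growing set of extreme points of the feasible flow polytope and reoptimizes over their convex hull. The simplest special case, which retains only the current iterate and one new vertex, is the Frank-Wolfe method; a sketch on a stand-in nonseparable convex cost over the unit simplex (total flow fixed at 1) shows the shape of each linearised subproblem.

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, iters=2000):
    """Minimise a convex cost over the unit simplex. Each linearised
    subproblem is solved by picking the vertex (single arc) with the
    smallest gradient component -- the column-generation step of
    simplicial decomposition reduced to one retained vertex."""
    x = x0.astype(float)
    for k in range(iters):
        g = grad(x)
        v = np.zeros_like(x)
        v[np.argmin(g)] = 1.0        # extreme point of the simplex
        gamma = 2.0 / (k + 2.0)      # standard open-loop step size
        x = (1 - gamma) * x + gamma * v
    return x

# Stand-in nonseparable convex "risk/cost": f(x) = x^T Q x
Q = np.array([[2.0, 0.5, 0.0],
              [0.5, 1.0, 0.3],
              [0.0, 0.3, 1.5]])
x = frank_wolfe_simplex(lambda x: 2 * Q @ x, np.ones(3) / 3)
```

    The full method in the paper keeps many vertices and solves a small master problem over their convex hull, which converges much faster at the cost of extra bookkeeping.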

    Developing a New Testing Paradigm for Risk-Assessment of Bee-Pesticide Interactions – Quantifying the Pace of Neonicotinoid Toxicokinetics

    Neonicotinoid pesticides, which are used to protect crops from certain pests, have been correlated with the decline of non-target insect species, including bumblebees. However, despite a myriad of studies into the interaction and impact of neonicotinoids, uncertainty remains as to the risks these xenobiotics pose to bees. In particular, the question of bioaccumulation, defined here as how long neonicotinoids persist in the body (i.e. fast or slow toxicokinetics), has not yet been answered for neonicotinoids and bumblebee species. Moreover, while the implications of bioaccumulation for non-target species are severe, regulatory standards continue to rely on acute paradigm testing (e.g. 48-hour LC50s or NOECs) that may inherently fail to capture bioaccumulation. First, I reviewed the literature on the pace of toxicokinetics for neonicotinoids, found in studies on enzymatic metabolism and receptor-site binding of these substances, which are the main pathways for clearance of xenobiotics. The literature supports that neonicotinoids have fast-paced toxicokinetics and are unlikely to bioaccumulate in bees. I further reviewed current regulatory practices (LC50s and NOECs), and how a proxy for bioaccumulation can be derived using dose-dependence studies analysed with Haber’s Law. Next, I conducted laboratory experiments examining the usefulness of Haber’s Law for quantifying bioaccumulation using the neonicotinoids imidacloprid and thiamethoxam, and the known bioaccumulative phenylpyrazole, fipronil, as a positive control. Here, not only did I corroborate the literature review findings that neonicotinoids likely have fast-paced toxicokinetics, but I also found evidence that fipronil has bioaccumulative properties, which underscores the usefulness of Haber’s Law in regulatory testing for bioaccumulation. Finally, I used 96-hour pulse-exposures to assess a proxy for toxicokinetic pace.
    Bees with pulsed exposures should have less injury than constant exposures if pesticides are easily cleared. Again, thiamethoxam and fipronil showed signs of differing toxicokinetic pace. These quantifiers could be used to fill a regulatory gap for bioaccumulation addressing toxicokinetic pace.
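    The pulsed-exposure logic rests on Haber's Law, under which toxic effect tracks the product of concentration and exposure time. A minimal sketch, with hypothetical concentrations and schedule, shows the prediction the 96-hour design tests: under fast clearance, a 50% duty-cycle pulse should yield roughly half the injury of constant exposure, while a bioaccumulative compound deviates upward.

```python
def cumulative_dose(concentration_schedule):
    """Haber's Law proxy: effect ~ sum of concentration * time, here
    tallied over hourly concentration values."""
    return sum(concentration_schedule)

# Hypothetical 96-hour designs sharing the same peak concentration:
constant = [1.0] * 96                     # continuous exposure
pulsed = ([1.0] * 24 + [0.0] * 24) * 2    # 24 h on / 24 h off pulses

# Expected injury ratio if the pesticide is cleared quickly:
ratio = cumulative_dose(pulsed) / cumulative_dose(constant)  # 0.5
```

    Observed injury well above this Haber's Law prediction during the off-phases is the signature of slow toxicokinetics, i.e. bioaccumulation.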

    The Investigation of Snowfall Rate Using Optical Techniques

    Experiments involving the attenuation of light by falling snow, or hydrometeors in general, lend themselves to the study of at least two problems of meteorological interest. The first problem is that of visibility in adverse weather conditions, and the second is that of finding a better method to measure precipitation. From the late 1940s through the late 1960s, various investigators have attempted to apply forward light scattering theory in the geometric optics range (α ≥ 200, where α = 2πr/λ, λ = wavelength and r = radius) to quantitatively describe the attenuation of a light beam, and thereby visibility changes due to hydrometeors.
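    The size parameter α = 2πr/λ quoted in the abstract is easy to check numerically. A short sketch, with an assumed snowflake radius of 0.5 mm and green light at 550 nm, confirms that snow particles sit far inside the geometric-optics regime (α ≥ 200):

```python
import math

def size_parameter(radius_m, wavelength_m):
    """Scattering size parameter alpha = 2*pi*r / lambda."""
    return 2 * math.pi * radius_m / wavelength_m

# Assumed values: 0.5 mm snowflake radius, 550 nm (green) light.
alpha = size_parameter(0.5e-3, 550e-9)
in_geometric_range = alpha >= 200   # True: well into geometric optics
```

    With α in the thousands, diffraction effects are negligible and ray-optics treatments of attenuation are justified for snowfall.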

    A globalisation of the Gelfand duality theorem

    In this paper we bring together results from a series of previous papers to prove the constructive version of the Gelfand duality theorem in any Grothendieck topos E, obtaining a dual equivalence between the category of commutative C∗-algebras and the category of compact, completely regular locales in the topos E.

    A Model for Classical Space-time Co-ordinates

    Field equations with general covariance are interpreted as equations for a target space describing physical space-time co-ordinates, in terms of an underlying base space with conformal invariance. These equations admit an infinite number of inequivalent Lagrangian descriptions. A model for reparametrisation-invariant membranes is obtained by reversing the roles of base and target space variables in these considerations. Comment: 9 pages, LaTeX. This was the basis of a talk given at the Argonne National Laboratory 1996 Summer Institute: Topics on Non-Abelian Duality, June 27-July 1.

    Properties of the Scalar Universal Equations

    The variational properties of the scalar so-called "Universal" equations are reviewed and generalised. In particular, we note that, contrary to earlier claims, each member of the Euler hierarchy may have an explicit field dependence. The Euler hierarchy itself is given a new interpretation in terms of the formal complex of variational calculus, and is shown to be related to the algebra of distinguished symmetries of the first source form. Comment: 15 pages, LaTeX article.